Managing GitHub Copilot Security and Legal Risks with FOSSA

Presented by

Jessica Black, Senior Software Engineer (FOSSA)

About this talk

Generative AI tools like GitHub Copilot have become increasingly popular in software development, and for good reason: multiple reports have shown that tools like Copilot can significantly improve development efficiency and increase developer satisfaction, among other benefits. At the same time, some engineering organizations have understandably been reluctant to adopt generative AI tools because of uncertainty around potential security, legal, and data privacy risks.

Over the past several months, FOSSA has been developing product functionality to help our customers manage these potential risks. Learn about these new features, and get big-picture guidance on generative AI risk-management best practices and processes, in this webinar with Senior Software Engineer Jessica Black, who is leading FOSSA's generative AI risk management feature development.

Jessica will discuss:

- Design principles behind FOSSA's new generative AI risk-management features
- How to use FOSSA's generative AI risk-management features to understand and manage security and legal risks
- Strategies to improve maintainability and code privacy when using generative AI code-generation tools
- GitHub Copilot settings that can guard against potential legal and privacy risks (a minimal example is sketched after this list)
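As a concrete illustration of that last point: at the time of writing, GitHub lets individuals and organizations set the "Suggestions matching public code" policy to "Blocked" so that Copilot filters out completions that closely match public code, and Copilot Individual users can opt out of having their code snippets used for product improvement. Those policies are configured on github.com rather than in a file. In the editor, Copilot can also be switched off for file types that are more likely to contain sensitive material. The snippet below is a minimal sketch of a VS Code settings.json fragment using the documented github.copilot.enable setting; the specific language IDs shown are illustrative choices for this page, not recommendations from the webinar.

    {
      // Enable Copilot by default, but disable suggestions for file types
      // that often hold configuration values, credentials, or other
      // sensitive content.
      "github.copilot.enable": {
        "*": true,
        "yaml": false,
        "json": false,
        "plaintext": false
      }
    }

VS Code's settings.json accepts comments, so the annotated fragment above can be pasted as-is; which file types (if any) to exclude is a per-team judgment call.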

About FOSSA

Up to 90% of any piece of software comes from open source, creating countless dependencies and areas of risk to manage. FOSSA is the most reliable automated policy engine for vulnerability management, license compliance, and code quality across the open source stack. With FOSSA, engineering, security, and legal teams all get complete and continuous risk mitigation for the entire software supply chain, integrated into each of their existing workflows.